
The 2 User Metrics That Matter for SEO

In the wake of Google’s Panda updates, there’s been a lot of fear regarding user metrics and how they impact SEO.  Many people are afraid that “bad” signals in analytics data, especially high bounce rates and low time-on-site, could potentially harm their rankings.

I don’t think Google is tapping into analytics data directly (I’ll defend that later), and I don’t think they have to. There are two user metrics that both Google and Bing have direct access to: (1) SERP CTR, and (2) “Dwell time”, and I think those two metrics can tell them a lot about your site.

The official word from Google is that analytics data is not used for ranking. Whether or not you believe that is entirely up to you, and I’m not here to argue about it. I’ll only say that it’s rare to hear Matt Cutts say something that emphatically. I think the arguments against using analytics directly as a ranking factor are much more practical in nature…

(1) Not Everyone Uses GA

Usage stats for GA are tough to pin down, but a large 2009 study placed the adoption rate at about 28%. I’ve seen numbers as high as 40% being quoted, but it’s likely that somewhere around 2/3 of all sites don’t have GA data. It’s tough for Google to penalize or devalue a site based on a factor that only exists on 1/3 of all sites. Worse yet, some of the largest sites don’t have GA data, because those are the sites that can afford traditional, enterprise analytics (WebTrends, Omniture, etc.).

(2) GA Can Be Mis-installed

Even for sites using GA, Google can’t control how it’s installed. I can tell you from consulting and from Q&A here on SEOmoz that GA is often installed badly. This can elevate bounce rates, reduce time-on-site, and generally add a lot of noise to the system.

(3) GA Can Be Manipulated

Of course, there’s a malicious version of (2) – you can mis-install GA on purpose. There are ways to manipulate most user metrics, if you want to, and there’s no scalable way for Google to double-check everyone’s installation and setup. Once the GA tags are in your hands, they’ve lost a lot of control.

To be fair, others disagree and think that Google will use any data they can get their hands on. Some have even produced indirect evidence that bounce rate is in play. I’m going to argue a simple point – that Google and Bing don’t need analytics data or bounce rate. They have all the data they need from their own logs.

The 1 Argument I Don’t Buy

One argument you hear all the time is that Google can’t possibly use something like bounce rate as a ranking signal, because bounce rate is very site-dependent and unreliable by itself. I hear it so often that I wanted to take a moment to say that I don’t buy this argument, for one simple reason. ANY ranking signal, by itself, is unreliable. I don’t know a single SEO who would argue that TITLE tags don’t matter, for example, and yet TITLE tags are incredibly easy to manipulate. On-page factors in general can be spammed – that’s why Google added links to the mix. Links can be spammed – that’s why they’re adding social metrics and user metrics. With over 200 ranking factors (Bing claims over 1,000), no single factor has to be perfect.

1. SERP Click-Through Rate (CTR)

The first metric I think Google makes broad use of is direct Click-Through Rate (CTR) from the SERPs themselves. Whether or not a result gets clicked on is one of Google’s and Bing’s first clues about whether any given result is a good match to a query. We know Google and Bing both have this data, because they directly report it to us.

In Google Webmaster Tools, you can find CTR data under “Your site on the web” > “Search queries”. It looks something like this:

Google Webmaster Tools screenshot

Bing reports similar data – from the “Dashboard”, click on “Traffic Summary”:

Bing Webmaster Tools screenshot

Of course, we also know that Google factors CTR heavily into their paid search quality score, and Bing has followed suit over the past year. While the paid search algorithm is very different from organic search, it stands to reason that they value CTR. Relevant results drive more clicks.
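To make the metric concrete: CTR is just clicks divided by impressions, aggregated per query/URL pair from the engine’s own logs. Here’s a minimal sketch – the log format and field names are hypothetical, purely for illustration, and obviously not how Google actually stores this:

```python
from collections import defaultdict

def serp_ctr(log_events):
    """Aggregate click-through rate per (query, url) pair.

    log_events: iterable of (query, url, event) tuples, where event is
    'impression' or 'click'. Hypothetical log schema for illustration.
    """
    impressions = defaultdict(int)
    clicks = defaultdict(int)
    for query, url, event in log_events:
        key = (query, url)
        if event == "impression":
            impressions[key] += 1
        elif event == "click":
            clicks[key] += 1
    # CTR = clicks / impressions for every pair that was ever shown
    return {key: clicks[key] / impressions[key] for key in impressions}

events = [
    ("seo tools", "example.com/a", "impression"),
    ("seo tools", "example.com/a", "click"),
    ("seo tools", "example.com/a", "impression"),
    ("seo tools", "example.com/b", "impression"),
]
print(serp_ctr(events))  # /a shown twice, clicked once: CTR 0.5; /b: 0.0
```

The point is scale, not code: the engines see every impression and every click, so they can compute this for every result they serve without touching your analytics.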

2. Dwell Time

Last year, Bing’s Duane Forrester wrote a post called “How to Build Quality Content”, and in it he referenced something called “dwell time”:

Your goal should be that when a visitor lands on your page, the content answers all of their needs, encouraging their next action to remain with you.  If your content does not encourage them to remain with you, they will leave.  The search engines can get a sense of this by watching the dwell time.  The time between when a user clicks on our search result and when they come back from your website tells a potential story.  A minute or two is good as it can easily indicate the visitor consumed your content.  Less than a couple of seconds can be viewed as a poor result.

Dwell time, in a sense, is an amalgam of bounce rate and time-on-site metrics – it measures how long it takes for someone to return to a SERP after clicking on a result (and it can be measured directly from the search engine’s own data).
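In other words, the engine only needs two timestamps it already has: when you clicked out to a result, and when you reappeared on the SERP. A rough sketch of that computation, with an invented event schema for illustration:

```python
def dwell_times(serp_events):
    """Compute dwell time per clicked result from one user's SERP session.

    serp_events: chronological list of (timestamp_seconds, action, url),
    where action is 'click_out' (left the SERP for url) or 'return'
    (came back to the SERP). Hypothetical schema, not a real log format.
    """
    dwell = {}
    pending = None  # (url, click_time) of the most recent click-out
    for ts, action, url in serp_events:
        if action == "click_out":
            pending = (url, ts)
        elif action == "return" and pending is not None:
            clicked_url, click_ts = pending
            dwell[clicked_url] = ts - click_ts
            pending = None
    return dwell

session = [
    (0, "click_out", "example.com/thin-page"),
    (3, "return", None),      # back after 3s - per Forrester, a poor result
    (10, "click_out", "example.com/useful-page"),
    (130, "return", None),    # back after 120s - content was consumed
]
print(dwell_times(session))
```

Note that no tracking code on your site is involved at any point – both events happen on the search engine’s own pages.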

Google hasn’t been quite so transparent, but there’s one piece of evidence that suggests strongly to me that they use dwell time as well (or something very similar). Last year, Google tested a feature where, if you clicked a listing and then quickly came back to the SERP (i.e. your dwell time was very low), you would get the option to block that site:

Screenshot of Google's block site option

This feature isn’t currently available for all users – Google has temporarily scaled back site blocking with the launch of social personalization. The fact that low dwell time triggered the ability to block a site, though, strongly suggests Google is tracking dwell time as a quality signal.

Where these 2 metrics really shine is as a duo. CTR by itself can easily be manipulated – you can drive up clicks with misleading titles and META descriptions that have little relevance to your landing page. That kind of manipulation will naturally lead to low dwell time, though. If you artificially drive up CTR and then your site doesn’t fulfill the promise of the snippet, people will go back to the SERPs. The combo of CTR and dwell time is much more powerful and, with just 2 metrics, removes a lot of quality issues. If you have both high CTR and high dwell time, you’re almost always going to have a quality, relevant result.

I’m not suggesting that bounce rate and other user metrics don’t matter. As I said, dwell time is connected (and probably well correlated) to both bounce rate and time-on-site. Glenn Gabe had a nice post on “actual bounce rate” and why dwell time may represent an improvement over bounce rate. I’m also sticking to traditional user metrics from analytics and leaving out broader metrics, like site speed and social signals, which clearly tie into user behavior.

What I want you to do is to take a broader view of these user metrics, from the search engine’s perspective, and not get obsessed with the SEO impact of your analytics data. I’ve seen people removing and even manipulating GA tags lately, for fear of SEO issues, and what they usually end up doing is just destroying the reliability of their own data. I don’t think either Google or Bing is using direct analytics data, and even if they do down the road, they’ll probably combine that data with other factors.

You should create search snippets that drive clicks to relevant pages and build pages that make people stay on your site. At the end of the day, it sounds pretty obvious, and it’s good for both SEO and conversion. Specifically, think about the combo – driving clicks is useless (and probably even detrimental to SEO) if most of the people clicking immediately leave your site. Work to find the balance and to target relevant keywords that drive the right clicks.
